Web Survey Bibliography
Impacts of Unit Nonresponse in a Recontact Study of Youth; 2013; Mendelson, J., Viera Jr., L.
When the propensity to respond to a survey is correlated with key survey variables, nonresponse bias can occur. One method of assessing nonresponse bias is to compare respondents with nonrespondents on auxiliary variables available for the drawn sample. A limitation of this method is that many frames contain only basic demographic variables, which may be poorly correlated with response propensity. For low-incidence and hard-to-reach populations, however, recontact studies are a popular option, and they often draw on rich sampling frames containing behavioral and attitudinal variables from previous surveys. This paper assesses the impact of unit nonresponse in a recontact study of young adults who had recently completed a similar 'seed' study. Both studies were sponsored by the U.S. Department of Defense; the initial study examined attitudes and behaviors pertaining to military recruiting, and the recontact study assessed awareness of and attitudes toward the Military's advertising campaigns. The seed study consisted of three iterations of a national mail survey of young adults ages 16 to 24, sampled from an address list database that covered more than 90% of the target population. Respondents to the seed study who provided an email address formed the sampling frame for the recontact study, which was completed online. Using auxiliary variables from the original frame and from responses to the seed study, we examine unit nonresponse in the recontact study to assess differences between respondents and nonrespondents and the impact on key survey estimates. First, we compare characteristics of respondents and nonrespondents on a variety of demographic, attitudinal, and behavioral measures. Where characteristics differ significantly between the two groups, we conduct regression analyses to determine whether these characteristics also significantly predict responses to survey questions in the recontact study. After examining the impact of unit nonresponse, we discuss implications for future research.
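The two-step diagnostic described above lends itself to a straightforward implementation: first, bivariate tests for respondent/nonrespondent differences on frame variables; second, regressions checking whether the differing characteristics also predict key recontact outcomes. The sketch below illustrates the idea in Python; it is not the authors' code, and the file name and variable names (responded, age, enlist_intent, ad_aware) are hypothetical stand-ins for the frame's demographic, attitudinal, and behavioral measures.

```python
# Minimal sketch of the two-step nonresponse assessment, under assumed data.
import pandas as pd
import statsmodels.api as sm
from scipy import stats

# Hypothetical frame: one row per seed-study respondent invited to the
# recontact study; 'responded' is a 0/1 indicator of recontact response.
frame = pd.read_csv("recontact_frame.csv")
resp = frame[frame["responded"] == 1]
nonresp = frame[frame["responded"] == 0]

# Step 1a: Welch t-test for a continuous auxiliary variable.
t, p = stats.ttest_ind(resp["age"], nonresp["age"], equal_var=False)
print(f"age: t = {t:.2f}, p = {p:.4f}")

# Step 1b: chi-square test for a categorical auxiliary variable
# (here assumed coded 0/1, e.g., intention to enlist from the seed study).
chi2, p, _, _ = stats.chi2_contingency(
    pd.crosstab(frame["enlist_intent"], frame["responded"])
)
print(f"enlist_intent: chi2 = {chi2:.2f}, p = {p:.4f}")

# Step 2: among respondents, regress a key recontact outcome (e.g., a 0/1
# ad-awareness item) on the characteristics that differed in Step 1.
X = sm.add_constant(resp[["age", "enlist_intent"]])
print(sm.Logit(resp["ad_aware"], X).fit().summary())
```

If Step 1 flags a variable that Step 2 shows is also predictive of the outcome, the abstract's logic implies a risk of bias in that estimate; in practice the same auxiliary variables would then feed a nonresponse weighting adjustment.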
Web survey bibliography
- Use of Smart Phones/Text Messaging to Increase Response Rates; 2013; DuBray, P.
- Designing Surveys for Tablets and Smartphones; 2013; Lakhe, S., Nichols, E. M., Olmsted, M. G., King, T.
- Tablets as Data Entry Interfaces – Solving Data Cleaning and Transcription Issues During Data...; 2013; Costall, A.
- Effects of Response Format on Measurement of Readership; 2013; Thomas, R. K., Cobb, C. L., Baim, J.
- Potential Impact of Modifying the Fielding Time of a Web-Based Survey; 2013; Baum, H. M., Chandonnet, A.
- How Representative are Google Consumer Surveys?: Results From an Analysis of a Google Consumer Survey...; 2013; Krishnamurty, P., Tanenbaum, E., Stern, M. J.
- One Drink or Two: Does Quantity Depicted in an Image Affect Web Survey Responses?; 2013; Charoenruk, N., Stange, M.
- A Comparison Between Screen/Follow Item Format and Yes/No Item Format on a Multi-Mode Federal Survey; 2013; Hernandez, S. J., Arakelyan, S. N., Welch, V. E.
- Using Multiple Modes in Follow-Up Contacts in Random-Digit Dialing Surveys; 2013; Chowdhury, P. P.
- Tablets and Smartphones and Netbooks, Oh My! Effects of Device Type on Respondent Behavior; 2013; Ross, H., Mendelson, J., Lackey, M.
- Impacts of Unit Nonresponse in a Recontact Study of Youth; 2013; Mendelson, J., Viera Jr., L.
- Multi-Mode Survey Administration: Does Offering Multiple Modes at Once Depress Response Rates?; 2013; Newsome, J., Levin, K., Langetieg, P., Vigil, M., Sebastiani, M.
- Responsive Design for Web Panel Data Collection; 2013; Bianchi, A., Biffignandi, S.
- Utilizing the Web in a Multi-Mode Survey; 2013; Venkataraman, L.
- Changing to a Mixed-Mode Design: The Role of Mode in Respondents' Decisions About Participation...; 2013; Collins, D., Mitchell, Ma., Toomes, M.
- Comparing the Effects of Mode Design on Response Rate, Representativeness, and Cost Per Complete in...; 2013; Tully, R.
- Internet Response for the Decennial Census – 2012 National Census Test; 2013; Reiser, C.
- The Effects of Pushing Web in a Mixed-Mode Establishment Data Collection; 2013; Ellis, C.
- The Effects of Errors in Paradata on Weighting Class Adjustments: A Simulation Study; 2013; West, B. T.
- Using Paradata to Study Response to Within-Survey Requests; 2013; Sakshaug, J. W.
- Paradata for Coverage Research; 2013; Eckman, S.
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Online Fundraising Essentials, Second Edition; 2013; Stevenson, S. C.
- Tips for Evaluating Online Effectiveness; 2013; Stevenson, S. C.
- The Digital Divide: The internet and social inequality in international perspective; 2013; Ragnedda, M., Muschert, G.
- Survey quality prediction system 2.0; 2013
- Speed (necessarily) doesn't kill: A new way to detect survey satisficing; 2013; Garland, P., Chen, K., Epstein, J., Suh, A.
- Practical tools for designing and weighting survey samples; 2013; Valliant, R. L., Dever, J. A., Kreuter, F.
- Paradata in web surveys; 2013; Callegaro, M.
- Incentive effects; 2013; Goeritz, A.
- A nationwide web-based freight data collection; 2013; Samimi, A., Mohammadian, A., Kawamura, K.
- The E-Interview in Qualitative Research; 2013; Bampton, R., Cowton, C., Downs, Y.
- Methodological Considerations of Qualitative Email Interviews; 2013; Nehls, K.
- Best Practice in Online Survey Research with Sensitive Topics; 2013; Kays, K., Keith, T. L., Broughal, M. T.
- Reducing Response Burden for Enterprises Combining Methods for Data Collection on the Internet; 2013; Vik, T.
- Advancing Research Methods with New Technologies; 2013; Sappleton, N.
- Data Quality in PC and Mobile Web Surveys; 2013; Mavletova, A. M.
- PDAs in socio-economic surveys: instrument bias, surveyor bias or both?; 2013; Escobal, J., Benites, S.
- Virtual research assistants: Replacing human interviewers by automated avatars in virtual worlds; 2013; Hasler, B. S., Tuchman, P., Friedman, D.
- Compared to a small, supervised lab experiment, a large, unsupervised web-based experiment on a previously...; 2013; Ryan, R. S., Wilde, M., Crist, S.
- From mixed-mode to multiple devices. Web surveys, smartphone surveys and apps: has the respondent gone...; 2013; Callegaro, M.
- Moving an established survey online – or not?; 2013; Barber, T., Chilvers, D., Kaul, S.
- An approach to selecting online respondents; 2013; Terhanian, G.
- By the Numbers: Theory of adaptation or survival of the fittest?; 2013; Cavallaro, K.
- Cyborgs vs. Monsters: Assembling Modular Surveys to Create Complete Datasets; 2013; Johnson, E. P., Siluk, L., Tarraf, S.
- Shorter Isn't Always Better; 2013; Burdein, I.
- Internet-Based Recruitment to a Depression Prevention Intervention: Lessons From the Mood Memos Study...; 2013; Morgan, A. J., Jorm, A. F., Mackinnon, A. J.
- Computer science security research and human subjects: Emerging considerations for research ethics boards...; 2013; Buchanan, E. A., Aycock, J., Dexter, S., Dittrich, D., Hvizdak, E. E.
- A standard for test reliability in group research; 2013; Ellis, J. L.
- Addressing Survey Nonresponse Issues: Implications for ATE Principal Investigators, Evaluators, and...; 2013; Welch, W. W., Barlau, A. N.